@ihower ihower commented Oct 4, 2025

Based on prior work in #732, the Local Shell Tool was nearly complete but still didn’t work in practice. This PR finishes the logic, adds tests, and provides a working example.

Changes

  1. Return tool output to the LLM

    The local command ran, but its output was not returned to the LLM. This PR sends the output back, as described in the Local Shell guide: https://platform.openai.com/docs/guides/tools-local-shell

  2. Use call_id

    Upstream (openai-python) LocalShellCallOutput uses id, but the server actually expects call_id. This PR sets call_id so the server accepts the output. A small type-check ignore workaround is included until the upstream type is fixed.

  3. Add examples/tools/local_shell.py, a working example.

  4. Add tests.
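To illustrate points 1 and 2, here is a minimal sketch of what returning the output looks like. The helper name `run_local_shell_call` is hypothetical (not from this PR); the item shape follows the Local Shell guide, with `call_id` rather than the `id` field that the upstream type currently declares:

```python
import subprocess


def run_local_shell_call(call_id: str, command: list[str]) -> dict:
    """Execute the requested shell command and build the follow-up
    input item that must be sent back to the model.

    Hypothetical helper for illustration; the real fix lives inside
    the SDK's local shell tool handling.
    """
    result = subprocess.run(command, capture_output=True, text=True)
    return {
        "type": "local_shell_call_output",
        # Upstream LocalShellCallOutput annotates this field as `id`,
        # but the server rejects the request unless it is `call_id`.
        "call_id": call_id,
        "output": result.stdout + result.stderr,
    }
```

Omitting either half reproduces the two 400 errors shown below: not sending the item at all yields "No tool output found for local shell call …", and sending it without `call_id` yields "Missing required parameter: 'input[3].call_id'".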

Before fix

You would get:

openai.BadRequestError: Error code: 400 - {'error': {'message': 'No tool output found for local shell call call_MEmrSaOboO5WG3MXGazSubb7.', 'type': 'invalid_request_error', 'param': 'input', 'code': None}}

and

Error getting response: Error code: 400 - {'error': {'message': "Missing required parameter: 'input[3].call_id'.", 'type': 'invalid_request_error', 'param': 'input[3].call_id', 'code': 'missing_required_parameter'}}. (request_id: req_xxx)

After fix

The example runs successfully and returns the expected result.

@seratch added the bug and feature:core labels Oct 5, 2025

seratch commented Oct 5, 2025

Thanks for sending this. It looks good. I will check after DevDay!
